On the Equivalence between Online and Private Learnability beyond Binary Classification

Neural Information Processing Systems

Alon et al. [2019] and Bun et al. [2020] recently showed that online learnability and private PAC learnability are equivalent in binary classification. We investigate whether this equivalence extends to multi-class classification and regression. First, we show that private learnability implies online learnability in both settings. Our extension involves studying a novel variant of the Littlestone dimension that depends on a tolerance parameter and on an appropriate generalization of the concept of threshold functions beyond binary classification. Second, we show that while online learnability continues to imply private learnability in multi-class classification, current proof techniques encounter significant hurdles in the regression setting. While the equivalence for regression remains open, we provide non-trivial sufficient conditions for an online learnable class to also be privately learnable.
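As background for the abstract's mention of threshold functions, the binary-classification connection between thresholds and the Littlestone dimension used by Alon et al. [2019] can be sketched as follows (these are known results stated for orientation, not the paper's new tolerance-based variant):

```latex
% A binary class $\mathcal{H} \subseteq \{0,1\}^{\mathcal{X}}$ contains
% $k$ thresholds if there exist $x_1,\dots,x_k \in \mathcal{X}$ and
% $h_1,\dots,h_k \in \mathcal{H}$ with
\[
  h_i(x_j) = 1 \iff j \le i .
\]
% Thresholds and the Littlestone dimension control each other up to
% logarithmic factors:
\[
  \mathrm{Ldim}(\mathcal{H}) \ge d
    \;\Longrightarrow\; \mathcal{H} \text{ contains } \lfloor \log d \rfloor
      \text{ thresholds},
  \qquad
  \mathcal{H} \text{ contains } k \text{ thresholds}
    \;\Longrightarrow\; \mathrm{Ldim}(\mathcal{H}) \ge \lfloor \log k \rfloor .
\]
```

Roughly, since a privately learnable binary class can contain only boundedly many thresholds, its Littlestone dimension is finite and it is therefore online learnable; the paper's first result develops a multi-valued analogue of this chain of implications.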


Review for NeurIPS paper: On the Equivalence between Online and Private Learnability beyond Binary Classification

Neural Information Processing Systems

Summary and Contributions: Update following the author response: I thank the authors for the clarifications. This paper explores the connection between learning with approximate differential privacy and online learning for multi-class and regression problems. It follows prior work that showed an equivalence between approximate privacy and online learning for binary classification. The paper has two main results: (1) Multi-class: consider a hypothesis class H of functions from X to Y, where Y is finite. Then H is learnable with respect to the 0-1 loss in the online setting if and only if it is learnable with approximate privacy in the standard stochastic batch setting.


Review for NeurIPS paper: On the Equivalence between Online and Private Learnability beyond Binary Classification

Neural Information Processing Systems

This is overall good work that builds technically on previous results. It contributes to our understanding of the relationship between online learnability and privacy.
